CLI Proxy API version 6.9.4, published by Luis Pater, is a lightweight proxy server that exposes the command-line interfaces of Google Gemini, OpenAI Codex, and Anthropic Claude as unified, OpenAI-compatible REST endpoints. Developers can thereby embed Gemini 2.5 Pro, GPT-5-class models, and Claude Code in any application that speaks the OpenAI SDK or standard HTTP libraries.

Acting as a network shim, the program listens on a local port, translates incoming ChatGPT-style requests into each backend's native authentication flow, and returns completion, chat, or code-generation responses in a single consistent JSON schema. Because traffic originates from the user's own machine, the utility supports unlimited local or multi-account scenarios at no extra cloud cost.

Typical use cases include wiring free-tier Gemini or GPT resources into IDE extensions, automated test pipelines, continuous-integration scripts, chatbot prototypes, batch data-labeling jobs, and classroom coding exercises that need a drop-in replacement for commercial OpenAI endpoints.

The same binary also handles OAuth for Claude and, as of the latest build, integrates China's Qwen Code model, expanding the roster of accessible engines to four distinct families. With 453 incremental releases tracked since inception, the project iterates rapidly to keep pace with provider API changes while remaining portable across Windows, macOS, and Linux.

CLI Proxy API is available for free on get.nero.com, where downloads are supplied through trusted Windows package sources such as winget, always serving the newest build and optionally supporting batch installation alongside other applications.
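Because the proxy speaks the OpenAI chat-completions wire format, any plain HTTP client can target it. A minimal sketch of building such a request against a local instance (the port number, endpoint path, and model identifier here are illustrative assumptions, not values confirmed by the project's documentation):

```python
import json
import urllib.request

# Hypothetical local endpoint -- the actual listening port is whatever
# the proxy is configured to use on your machine.
PROXY_URL = "http://localhost:8317/v1/chat/completions"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions POST aimed at the local proxy."""
    payload = {
        "model": model,  # a backend model name the proxy routes, e.g. a Gemini model
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("gemini-2.5-pro", "Write a haiku about proxies.")
# With the proxy running, urllib.request.urlopen(req) would return a
# standard OpenAI-compatible JSON completion body.
```

Since the request body is ordinary OpenAI JSON, the same shape works whether the proxy routes it to Gemini, Codex, or Claude behind the scenes.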